Joint Independent Subspace Analysis: A Quasi-Newton Algorithm

Authors

  • Dana Lahat
  • Christian Jutten
Abstract

In this paper, we present a quasi-Newton (QN) algorithm for joint independent subspace analysis (JISA). JISA is a recently proposed generalization of independent vector analysis (IVA). JISA extends classical blind source separation (BSS) to jointly resolve several BSS problems by exploiting statistical dependence between latent sources across mixtures, as well as by relaxing the assumption of statistical independence within each mixture. Algebraically, JISA based on second-order statistics amounts to coupled block diagonalization of a set of covariance and cross-covariance matrices, as well as block diagonalization of a single permuted covariance matrix. The proposed QN algorithm asymptotically achieves the minimal mean square error (MMSE) in the separation of multidimensional Gaussian components. Numerical experiments demonstrate the convergence and source separation properties of the proposed algorithm.
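To make the algebraic statement above concrete, the following Python sketch (an illustration of the second-order JISA model only, not the authors' algorithm; the helper `block_offdiag_norm`, the data shapes, and all constants are assumptions chosen for the example) generates two mixtures whose two-dimensional latent components are correlated across mixtures but mutually independent within each mixture, and checks that the true demixing matrices jointly block-diagonalize every covariance and cross-covariance matrix.

```python
# Minimal sketch of the second-order JISA model (not the authors' code).
import numpy as np

rng = np.random.default_rng(0)
K, dims = 2, [2, 2]          # two mixtures, two 2-D latent components each
m, n = sum(dims), 100_000    # source-vector length, number of samples

# Draw sources so that component i is correlated ACROSS mixtures while the
# two components within each mixture are independent (the JISA assumption).
base = [rng.standard_normal((d, n)) for d in dims]
S = [np.vstack([b + 0.5 * rng.standard_normal(b.shape) for b in base])
     for _ in range(K)]

A = [rng.standard_normal((m, m)) for _ in range(K)]   # mixing matrices
X = [A[k] @ S[k] for k in range(K)]                   # observed mixtures

# Cross-covariance matrices R^{(k,l)} = E[x^{(k)} x^{(l)T}]
R = {(k, l): X[k] @ X[l].T / n for k in range(K) for l in range(K)}

# With the true demixing matrices W^{(k)} = inv(A^{(k)}), every matrix
# W^{(k)} R^{(k,l)} W^{(l)T} is block-diagonal with 2x2 blocks.
W = [np.linalg.inv(A[k]) for k in range(K)]

def block_offdiag_norm(M, dims):
    """Frobenius norm of the entries outside the diagonal blocks."""
    mask = np.ones_like(M, dtype=bool)
    i = 0
    for d in dims:
        mask[i:i + d, i:i + d] = False
        i += d
    return np.linalg.norm(M[mask])

for (k, l), Rkl in R.items():
    D = W[k] @ Rkl @ W[l].T
    print(k, l, block_offdiag_norm(D, dims) / np.linalg.norm(D))  # near 0
```

A JISA algorithm such as the proposed QN method must recover demixing matrices achieving this joint block-diagonal structure from the R^(k,l) alone; the sketch obtains them by inverting the known mixing matrices purely to exhibit the structure.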


Similar articles


A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization

In this paper we propose a subspace limited memory quasi-Newton method for solving large-scale optimization with simple bounds on the variables. The limited memory quasi-Newton method is used to update the variables with indices outside of the active set, while the projected gradient method is used to update the active variables. The search direction consists of three parts: a subspace quasi-Ne...
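As a rough illustration of the active-set split described in this abstract (not the paper's method; the function `projected_qn_step` and all constants are assumptions), the sketch below pins variables that sit at a bound with the gradient pushing outward, and steps the remaining free variables. A faithful implementation would replace the steepest-descent direction on the free variables with an L-BFGS direction.

```python
# Minimal sketch of a projected step with an active-set split (illustrative).
import numpy as np

def projected_qn_step(x, grad, lo, hi, step=0.2, tol=1e-10):
    # Active set: variables at a bound with the gradient pushing outward.
    at_lo = (x <= lo + tol) & (grad > 0)
    at_hi = (x >= hi - tol) & (grad < 0)
    free = ~(at_lo | at_hi)
    x_new = x.copy()
    # Free variables: quasi-Newton step.  A real implementation would apply
    # an L-BFGS inverse-Hessian estimate; -grad is used as a stand-in here.
    x_new[free] = x[free] - step * grad[free]
    # Active variables: projected-gradient step (this keeps them pinned).
    x_new[~free] = x[~free] - step * grad[~free]
    return np.clip(x_new, lo, hi)

# Tiny example: minimize ||x - c||^2 over the box [0, 1]^3, c partly outside.
c = np.array([1.5, -0.3, 0.4])
lo, hi = np.zeros(3), np.ones(3)
x = np.full(3, 0.5)
for _ in range(100):
    x = projected_qn_step(x, 2.0 * (x - c), lo, hi)
print(np.round(x, 3))   # -> approximately [1.0, 0.0, 0.4]
```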


Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods

We present an algorithm for minimizing a sum of functions that combines the computational efficiency of stochastic gradient descent (SGD) with the second order curvature information leveraged by quasi-Newton methods. We unify these disparate approaches by maintaining an independent Hessian approximation for each contributing function in the sum. We maintain computational tractability and limit ...
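The idea of one curvature estimate per summand can be sketched as follows (an illustration under strong simplifications, not the paper's algorithm: each term is a diagonal quadratic, the per-term Hessian estimates are diagonal secant estimates, and all names and constants are assumptions).

```python
# Minimal sketch: SGD preconditioned by per-term curvature estimates.
import numpy as np

rng = np.random.default_rng(1)
N, d = 5, 3
C = rng.standard_normal((N, d))          # minimizer of each term f_i
h = rng.uniform(0.5, 3.0, size=(N, d))   # true diagonal curvature of f_i

def grad_i(i, x):                        # f_i(x) = 0.5 * sum h_i * (x - c_i)^2
    return h[i] * (x - C[i])

H_est = np.ones((N, d))                  # one Hessian estimate per term
x = np.zeros(d)
for t in range(3000):
    i = rng.integers(N)                  # sample one term, as in SGD
    g = grad_i(i, x)
    eps = 1e-4                           # secant-style curvature refresh
    H_est[i] = np.clip((grad_i(i, x + eps) - g) / eps, 1e-3, 1e3)
    lr = 1.0 / (1.0 + t / 100.0)
    # N*g is an unbiased estimate of the full gradient; precondition it
    # with the SUM of the per-term curvature estimates.
    x = x - lr * N * g / H_est.sum(axis=0)

x_star = (h * C).sum(axis=0) / h.sum(axis=0)   # exact minimizer of the sum
print(np.linalg.norm(x - x_star))              # small, up to SGD noise
```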


A Dual Approach to Semidefinite Least-Squares Problems

In this paper, we study the projection onto the intersection of an affine subspace and a convex set and provide a particular treatment for the cone of positive semidefinite matrices. Among applications of this problem is the calibration of covariance matrices. We propose a Lagrangian dualization of this least-squares problem, which leads us to a convex differentiable dual problem. We propose to...
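The primal problem this abstract refers to can be illustrated on the classical nearest-correlation-matrix instance, where the affine subspace is the set of unit-diagonal matrices and the convex set is the PSD cone. The sketch below (all names are illustrative assumptions) uses plain alternating projections, which only finds a point in the intersection; the paper's dual approach, and on the primal side Dykstra's correction, target the actual least-squares projection.

```python
# Minimal sketch: alternating projections onto an affine subspace
# (unit diagonal) and the PSD cone (illustrative, not the paper's method).
import numpy as np

def proj_psd(M):
    """Project a symmetric matrix onto the PSD cone (clip eigenvalues)."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return V @ np.diag(np.maximum(w, 0)) @ V.T

def proj_unit_diag(M):
    """Project onto the affine subspace of matrices with unit diagonal."""
    P = M.copy()
    np.fill_diagonal(P, 1.0)
    return P

# An indefinite "covariance" matrix to be calibrated.
G = np.array([[0.7, 0.9, 0.7],
              [0.9, 0.7, 0.9],
              [0.7, 0.9, 0.7]])
X = G.copy()
for _ in range(200):
    X = proj_psd(proj_unit_diag(X))
print(np.round(np.diag(X), 4))            # diagonal approximately all ones
print(round(np.linalg.eigvalsh(X).min(), 6))   # smallest eigenvalue >= 0
```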


Quasi-Newton Methods for Nonconvex Constrained Multiobjective Optimization

Here, a quasi-Newton algorithm for constrained multiobjective optimization is proposed. Under suitable assumptions, global convergence of the algorithm is established.




Publication date: 2015